Neural Architectures for Fine-grained Entity Type Classification

Authors

  • Kentaro Inui
  • Sebastian Riedel
  • Pontus Stenetorp
  • Sonse Shimaoka
Abstract

In this work, we investigate several neural network architectures for fine-grained entity type classification. In particular, we consider extensions to a recently proposed attentive neural architecture and make three key contributions. While previous work on attentive neural architectures does not consider hand-crafted features, we combine learnt and hand-crafted features and observe that they complement each other. Additionally, through quantitative analysis we establish that the attention mechanism is capable of learning to attend over syntactic heads and the phrase containing the mention, both of which are known strong hand-crafted features for our task. We also enable parameter sharing through a hierarchical label encoding method that, in low-dimensional projections, shows clear clusters for each type hierarchy. Lastly, despite using the same evaluation dataset, the literature frequently compares models trained on different data. We establish that the choice of training data has a drastic impact on performance, with decreases of as much as 9.85% loose micro F1 score for a previously proposed method. Despite this, our best model achieves state-of-the-art results with 75.36% loose micro F1 score on the well-established FIGER (GOLD) dataset.
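The core of the attentive architecture described above can be illustrated with a small sketch. This is not the authors' code; the dimensions, the dot-product scoring form, and all values are assumptions chosen purely for illustration. It computes a softmax attention distribution over context token representations (e.g. BiLSTM outputs around a mention) and a weighted context summary:

```python
import numpy as np

# Illustrative sketch, not the paper's implementation: attention over
# context token representations around an entity mention. The hidden
# size, number of tokens, and dot-product scoring are assumptions.
rng = np.random.default_rng(0)

d = 8                                # hidden size of each representation
context = rng.normal(size=(5, d))    # 5 context tokens (e.g. BiLSTM outputs)
w = rng.normal(size=d)               # learnable attention query vector

scores = context @ w                 # one scalar score per context token
weights = np.exp(scores - scores.max())
weights /= weights.sum()             # softmax: attention distribution

summary = weights @ context          # weighted sum -> context vector

assert np.isclose(weights.sum(), 1.0)
print(summary.shape)                 # a single d-dimensional summary
```

In the paper's setting, a distribution like `weights` is what can be inspected to see whether the model attends to syntactic heads and the mention-containing phrase.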


Similar Articles

Topic Information Based Neural Network Model for Fine-grained Entity Type Classification

Entity recognition is an important part of natural language processing, but most current entity recognition systems are restricted to a limited set of entity classes (e.g., person, location, organization, or miscellaneous). Fine-grained entity type classification has therefore become an active research topic. This paper proposes a neural network model based on topic information for fine-grain...


An Attentive Neural Architecture for Fine-grained Entity Type Classification

In this work we propose a novel attention-based neural network model for the task of fine-grained entity type classification that, unlike previously proposed models, recursively composes representations of entity mention contexts. Our model achieves state-of-the-art performance with 74.94% loose micro F1 score on the well-established FIGER dataset, a relative improvement of 2.59%. We also investiga...


End-to-End Trainable Attentive Decoder for Hierarchical Entity Classification

We address fine-grained entity classification and propose a novel attention-based recurrent neural network (RNN) encoder-decoder that generates paths in the type hierarchy and can be trained end-to-end. We show that our model performs better on fine-grained entity classification than prior work that relies on flat or local classifiers that do not directly model hierarchical structure.


Embedding Methods for Fine Grained Entity Type Classification

We propose a new approach to the task of fine-grained entity type classification based on label embeddings that allows for information sharing among related labels. Specifically, we learn an embedding for each label and each feature such that labels which frequently co-occur are close in the embedded space. We show that it outperforms state-of-the-art methods on two fine-grained entity-classif...
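The label-embedding idea above can be sketched as a toy example. This is an illustration, not the paper's method: the labels, co-occurrence counts, embedding size, and the simple attraction update are all made-up assumptions, used only to show how frequently co-occurring labels end up close in the embedded space:

```python
import numpy as np

# Toy sketch (not the paper's implementation): pull embeddings of
# frequently co-occurring labels together. Labels, counts, and the
# update rule are assumptions for illustration.
rng = np.random.default_rng(0)
labels = ["/person", "/person/artist", "/location"]
E = rng.normal(scale=0.1, size=(3, 4))   # one 4-d embedding per label

# hypothetical co-occurrence counts between label indices
cooc = {(0, 1): 50, (0, 2): 1}           # /person & /person/artist co-occur often
lr = 0.01
for _ in range(200):
    for (i, j), count in cooc.items():
        diff = E[i] - E[j]
        E[i] -= lr * (count / 50) * diff  # attraction scaled by frequency
        E[j] += lr * (count / 50) * diff

d01 = np.linalg.norm(E[0] - E[1])        # frequent pair
d02 = np.linalg.norm(E[0] - E[2])        # rare pair
print(d01 < d02)                          # frequent pair ends up closer
```

Real label-embedding methods learn these vectors jointly with feature embeddings from annotated data, but the geometric effect is the same: related labels cluster, enabling information sharing.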


Convolutional Low-Resolution Fine-Grained Classification

Successful fine-grained image classification methods learn subtle details between visually similar (sub-)classes, but the problem becomes significantly more challenging if the details are missing due to low resolution. Encouraged by the recent success of Convolutional Neural Network (CNN) architectures in image classification, we propose a novel resolution-aware deep model which combines convol...




Publication date: 2017